From:                              route@monster.com

Sent:                               Monday, October 24, 2016 10:31 AM

To:                                   hg@apeironinc.com

Subject:                          Please review this candidate for: DNS Secret

 

This resume has been forwarded to you at the request of Monster User xapeix03

Udaya N Pakalapati

Last updated:  09/13/16

Job Title:  not specified

Company:  Apeiron, Inc.

Rating:  Not Rated

Screening score:  not specified

Status:  Resume Received


Sunnyvale, CA  94086
US


 

 

RESUME

  

Resume Headline: CV_Uday_Hadoop_July_2016 (1).doc

Resume Value: uekainrg92aeiure   

  

 

T-7328237066

Udaya NP | Hadoop Admin | Linux Admin | Windows Admin | udaynpusa@gmail.com

Professional Summary

·   6+ years of experience in Hadoop, Linux, and Windows administration. Working as a Hadoop admin in big data, with experience on AWS, Windows, Active Directory, Exchange, and Office 365; have used ServiceNow, JIRA, and HPSM as ticketing tools.

 

·   Hands-on experience in installing and configuring MapReduce, HDFS, HBase, Oozie, Hive, Sqoop, Pig, Ambari, Zookeeper, Hue, and YARN

 

·   Installed, configured, and upgraded OS when required

 

·   Worked with structured, semi-structured, and unstructured data

 

·   Excelled in importing data into HDFS from various sources such as Oracle, DB2, and SQL

 

·   Experienced in installing and configuring Cloudera

 

·   Handled security permissions to users

 

·   Worked closely with management and other departments to configure the system correctly

 

·   Skilled in implementing fair scheduler to manage resources during peak times

 

·   Gained extensive experience managing and reviewing Hadoop Log files

 

·   Monitored the performance of the Hadoop ecosystem.

 

·   Strong technical, administration and monitoring knowledge in Bigdata/Hadoop


 


·   Well experienced with AWS IAM, S3, VPC, Subnets, OpsWorks, Route 53, CloudFormation, Service Catalog, and EMR services.

 

·   Experience in managing Red Hat IPA (Identity, Policy, and Audit).

 

·   Set up cluster servers on AWS and managed them.

 

·   Excellent scripting skills in UNIX shell and Python

 

·   Enthusiastic, self-starter, eager to meet challenges and quickly assimilate latest technologies, skills, concepts and ideas.
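The fair-scheduler work mentioned above is typically driven by a YARN allocations file; the sketch below is a minimal hypothetical example, where the queue names, weights, and resource figures are illustrative only, not taken from any actual cluster.

```xml
<?xml version="1.0"?>
<!-- Hypothetical fair-scheduler.xml: queue names and figures are examples only -->
<allocations>
  <!-- ETL queue gets twice the share of ad-hoc work during contention -->
  <queue name="etl">
    <weight>2.0</weight>
    <minResources>10000 mb, 10 vcores</minResources>
  </queue>
  <queue name="adhoc">
    <weight>1.0</weight>
    <maxRunningApps>20</maxRunningApps>
  </queue>
</allocations>
```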

 

JOB SUMMARY

CLIENT                                    ROLE                DURATION
Comcast, Sunnyvale, California            Hadoop Admin        March 2016 – till date
Microsoft, Seattle (Mindtree Ltd)         Senior Engineer     Sept '13 – Jan '16
FRHI, San Jose (ITC Infotech)             IT Consultant       Oct – Sept '13
Torry Harris Business Solutions, India    Sr. Server Admin    Oct '10 – July '12


Technical Skills Summary

·   Hadoop Eco-System:                Hive, Pig, Flume, Oozie, Sqoop, Spark, Impala, HBase

·   Operating Systems:                RedHat Linux 5.x/6.x; Windows 95, 98, NT, 2000, Vista, 7

·   Configuration Management Tools:   Puppet

·   Databases:                        Oracle 10g (SQL), MySQL, SQL Server 2008

·   Hadoop Configuration Management:  Cloudera Manager, Ambari

·   Monitoring Tools:                 Ganglia, Nagios

·   Scripting Languages:              Shell scripting, PowerShell

·   Configuration / Protocols:        DNS, DHCP, WINS, VPN, TCP/IP, SNMP, IMAP, POP3, SMTP, PKI, DFS

·   Ticketing Systems:                Remedy, ServiceNow, IBM Tivoli

·   Backup Software:                  NetBackup, Tivoli, Commvault, NT Backup, DPM 2012


Professional Experience:

Client/Organization: Comcast, Sunnyvale, California

Period: March 2016 – Till date

Position: Hadoop Admin

·   Set up Hadoop MapR clusters in AWS and in an on-prem datacenter

·   Created data pipelines

·   Worked on AWS for Oracle BDD, Cloudera Hadoop, and MapR

·   Coordinated with the Dev team and maintained the cluster

·   Configured Hive, Hue, and Drill on MapR

·   Analyzed Hadoop cluster and other big data analysis tools including Pig

·   Implemented multiple nodes on CDH3 Hadoop and Hortonworks clusters on Red Hat Linux

·   Built a scalable distributed data solution

·   Imported data from Linux file system to HDFS

·   Loaded data from UNIX to HDFS

·   Installed clusters, starting and stopping data nodes, and recovered name nodes

·   Assisted with capacity planning and slot configuration

·   Worked on managing of Ambari, Zookeeper, Hue and Yarn

·   Created tables and views in Teradata

·   Created HBase tables to house data from different sources

·   Transmitted data from SQL to HBase using Sqoop

·   Worked with a team to successfully tune Pig query performance

·   Excelled in managing and reviewing Hadoop log files

·   Configured Oozie to run multiple Hive and Pig jobs

·   Worked with management to determine the optimal way to report on data sets

·   Installed, configured, and monitored Hadoop Clusters using Cloudera

·   Installed, upgraded, and patched ecosystem products using Cloudera Manager

·   Balanced and tuned HDFS, Hive, Impala, MapReduce, and Oozie work flows

·   Maintained and backed up meta-data

·   Configured Kerberos for the clusters

·   Used data integration tools like Flume and Sqoop

·   Set up automated processes to analyze the system and find errors

·   Supported IT department in cluster hardware upgrades

·   Environment: Hadoop, HDFS, Pig, Sqoop, HBase, Shell Scripting, Ubuntu, Linux Red Hat
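The Hadoop log review described in this role can be sketched as a small script; this is a hypothetical example assuming only that the logs carry standard log4j severity keywords, not a tool from the actual engagement.

```python
import re
from collections import Counter

def summarize_log(lines):
    """Count log4j-style severity levels (INFO/WARN/ERROR/FATAL) in log lines."""
    level_re = re.compile(r"\b(INFO|WARN|ERROR|FATAL)\b")
    counts = Counter()
    for line in lines:
        match = level_re.search(line)
        if match:
            counts[match.group(1)] += 1
    return counts

# Invented sample lines in the usual namenode/datanode log shape.
sample = [
    "2016-03-01 10:00:01 INFO namenode.FSNamesystem: roll edit log",
    "2016-03-01 10:00:02 WARN hdfs.DFSClient: slow read",
    "2016-03-01 10:00:03 ERROR datanode.DataNode: disk failure",
]
print(summarize_log(sample))  # one count per severity level seen
```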


Professional Experience:

Client/Organization: Microsoft, Seattle (Mindtree)

Period: Sept '13 – Jan '15

Position: Senior Engineer

RESPONSIBILITIES:

 

·   Worked on all types of hardware from Dell, HP, and IBM; installed Linux and Windows OS.

·   Setup, configured, and managed security for the Cloudera Hadoop cluster

·   Worked on MapR and Hortonworks Hadoop clusters.

·   Used Hive and Pig to perform data analysis

·   Worked on managing of Ambari, Zookeeper, Hue and Yarn

·   Loaded log data into HDFS using Flume

·   Created multi-cluster test to test the system's performance and failover

·   Improved a high-performance cache, leading to a greater stability and improved performance

·   Built a scalable Hadoop cluster for data solution

·   Responsible for maintenance and creation of nodes

·   Managed log files, backups and capacity

·   Found and troubleshot Hadoop errors

·   Worked with other teams to decide the hardware configuration

·   Implemented cluster high availability

·   Scheduled jobs using Fair Scheduler

·   Configured alerts to find possible errors

·   Handled patches and updates

·   Worked with developers to set up a full Hadoop system on AWS

·   Well experienced with AWS IAM, S3, VPC, Subnets, OpsWorks, Route 53, CloudFormation, Service Catalog, and EMR services.

·   Experience in managing Red Hat IPA (Identity, Policy, and Audit).

·   Set up cluster servers on AWS and managed them

·   Worked on Python and shell scripting

·   Environment: HDFS CDH3, CDH4, Hbase, NOSQL, RHEL 4/5/6, Hive, Pig, Perl Scripting and AWS S3, EC2

·   Worked on PowerShell scripts for automation; worked on RCA findings.

·   Worked on server hardware issues and coordinated with vendors to replace faulty items.

·   Configured and maintained DNS and DHCP.

·   Worked on VMware issues along with Hyper-V

·   Performed technical audits of teammates' cases and gave them feedback.

·   Supported Exchange Server 2007/2010 environments on Windows

·   Worked on O365 in both hybrid and non-hybrid environments.

·   Worked on site user issues for email, Lync, and Microsoft Outlook.

·   Worked on tickets from high priority to low priority.

·   Checked AD site-to-site replication and resolved issues.
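The log-file and backup management in this role can be sketched as a nightly archive job; this is a minimal shell sketch, with temp directories standing in for the real log and backup paths, which are not given in the resume.

```shell
# Hypothetical nightly log-archive rotation; all paths are temp-dir stand-ins.
set -eu
LOG_DIR="$(mktemp -d)"       # stand-in for something like /var/log/hadoop
ARCHIVE_DIR="$(mktemp -d)"   # stand-in for the backup target
printf 'sample log line\n' > "$LOG_DIR/namenode.log"

# Archive today's logs into a date-stamped tarball.
STAMP="$(date +%Y%m%d)"
tar -czf "$ARCHIVE_DIR/logs-$STAMP.tar.gz" -C "$LOG_DIR" .

# Keep only the 7 most recent archives (a no-op with a single archive).
ls -1t "$ARCHIVE_DIR"/logs-*.tar.gz | tail -n +8 | xargs -r rm -f
echo "archived to $ARCHIVE_DIR/logs-$STAMP.tar.gz"
```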


 

 

Professional Experience:

Client/Organization: FRHI, San Jose (ITC Infotech)

Period: Sep '12 – Sep '13

Position: IT Consultant

RESPONSIBILITIES:

·   Implemented multiple nodes on CDH3 Hadoop cluster on Red hat Linux

·   Imported data from Linux file system to HDFS

·   Loaded data from UNIX to HDFS

·   Used Windows Server 2008/2012 Active Directory and the Exchange Management Console for Exchange 2003, 2007, and 2010.

·   Monitored trouble ticket queue to attend user and system calls.

·   Set up a domain and Active Directory on a Windows 2008 server.

·   Worked as a support engineer supporting Active Directory on Windows 2003/2008, group policies, DNS, DHCP, Citrix, VM, SCOM, SCCM, and Exchange Server 2007/2010 with O365.

·   Experienced on configuring and installing VMware (Virtualization) and Hyper V

·   Manage and monitor all D2D and D2T backups for the enterprise.

·   Perform daily, weekly and monthly disaster recovery audits.

·   Ensure company data is stored safely and securely by performing random audits of the tape storage vendor.

·   Worked on Apache 2 and Apache SSL configuration

·   Developed and implemented policies and procedures for computer systems operations and development; ensured the technology is accessible and equipped with current hardware and software.

·   Provision and maintain server infrastructure services such as Active Directory (AD), DHCP, DNS, authentication, and network services

·   Experience in deployment and management of complex Active Directory (AD) environment

·   Hands-on knowledge of Active Directory Federation Services 2.0 (ADFS 2.0)

·   Experience integrating ADFS and/or OpenAM with Azure AD

·   Scripting experience in PowerShell

·   Set up whole-root zones/containers on Solaris 10 for application management; modified zone settings and network settings, imported file systems, migrated zone paths, and migrated zones between Solaris 10 servers.

·   Created and configured volumes on Solaris 10 systems; configured and managed storage volumes with LVM on RHEL/CentOS systems.

·   Installed and configured the iSCSI utility on RHEL/CentOS 6.4 servers for network-attached storage.

·   Configured the Apache web server on Solaris 10; installed and configured a Samba server for quick publishing using a third-party web page maker.

·   Installed and configured LAMP on RHEL/CentOS servers.

·   Moderate experience with Chef configuration management on RHEL/CentOS servers.

·   Built a scalable distributed data solution
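The Apache SSL configuration work in this role typically centers on a TLS virtual host; the fragment below is a minimal sketch in which the hostname and certificate paths are placeholders, not values from the actual environment.

```apache
# Hypothetical httpd SSL virtual host; hostname and cert paths are examples only
<VirtualHost *:443>
    ServerName example.internal
    DocumentRoot /var/www/html
    SSLEngine on
    SSLCertificateFile    /etc/httpd/ssl/example.crt
    SSLCertificateKeyFile /etc/httpd/ssl/example.key
    # Disable legacy protocols
    SSLProtocol all -SSLv2 -SSLv3
</VirtualHost>
```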

 

 

 

Client/Organization: THBS

Period: Oct '10 – July 2012

Position: Server Admin

 

RESPONSIBILITIES:

 

·   Installed and monitored big data ecosystem components.

·   Worked on Cloudera HDFS.

·   Used Windows Server 2008/2012 Active Directory and the Exchange Management Console for Exchange 2003, 2007, and 2010.

·   Provided support for virtual server solutions including VMware and ESX hosts, adding disk space from SAN and expanding partitions using diskpart and extpart; performed server patching and Windows server clustering setup.

·   Support of client backups including Symantec local, Tivoli and commvault, creation of new backup jobs for new servers.

·   Remote support of clients using Kaseya, Secret Server and Join.Me

·   Managed Citrix and Xenapp environments.

·   Maintenance of multiple Symantec End Point anti-virus servers at multiple client sites.

·   Daily check of backups of Symantec Backup Exec at office locations. Set up and maintain tape rotation schedule, creation of new backup jobs for new servers. Set up and maintenance of Data Domain storage and replication system.

·   Built new Windows 2003 and 2008 servers; added systems to the WhatsUp Gold SNMP tracking system, Symantec Backup, and McAfee anti-virus software.

·   Built HP blade servers using iLO and virtual servers using VMware vSphere 4.1.

·   Daily use of Active Directory and Exchange Management Console.

·   Documented build procedures for physical and virtual servers. 

·   Used Automate IT software to provide replication of directories to datacenter servers.

·   Maintenance of McAfee anti-virus system including maintaining 4 repositories and compliance for 200 servers and 600 workstations. 

·   Provide daily support for Help Desk tickets on Aldon tracking system.

·   Monitor and analyze servers and resolve any problems, maintain systems reporting, tuning.

·   Created users, managed user permissions, and maintained user and file-system quotas on Linux servers.

·   Configured volume groups and logical volumes, extended logical volumes for file system growth needs using Logical Volume Manager (LVM) commands.

·   Maintaining integrity of security with the use of group policies across domains.

·   Supporting users through email, on call and troubleshooting.

·   Maintaining inventory of all components including systems and other hardware.

·   Performed User Account management, data backups, and users' logon support.

·   Maintained users' data backups by creating per-user folders on the file server and applying security permissions on the folders.

·   Monitored trouble ticket queue to attend user and system calls.

·   Attended team meetings, change control meetings to update installation progress and for upcoming changes in environment.

·   Imported data from Linux file system to HDFS

·   Loaded data from UNIX to HDFS

·   Perform day-to-day Technical Support, analyzing troubleshooting and resolved Technical problems related to servers.

·   Perform User and Security Administration and Implementing File Permissions for the Users and Groups.

·   Configuring Role-Based Access Control (RBAC) & Access Control List (ACL).

·   Maintaining Service Management Facility (SMF) in Solaris 10.

·   Resolved system hardware and software errors and crashes, huge file sizes, and file-system-full errors on UNIX and Linux servers.

·   Responsible for Package and Patch Management & Installation on servers.

·   Extensively worked on administering NFS, DNS, DHCP, NIS, NIS+, LDAP, Mail Servers and Samba server.

·   Installed, configured, and maintained WebLogic Server, WebSphere Application Server, and Apache/Tomcat web servers on UNIX.

·   Encapsulating and Mirroring the Root Disk. Implementing RAID 0, RAID 1, RAID 0+1 & RAID 5 on multiple disks using Solaris Volume Manager.

·   Increased and decreased file-system sizes using Logical Volume Manager.

·   Provided network and servers development and support


·   Developed and implemented the Unix part of the IT infrastructure for the company's customers

·   Utilized Hyper-V to deploy and configure Linux servers

·   Supported and administered Linux servers for the company's customers

·   Implemented a sophisticated mail system with outgoing IP spreading depending on sender domains
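The file-system-full troubleshooting in this role usually starts from `df` output; the sketch below parses `df -P`-style text and flags filesystems over a usage threshold. It is a hypothetical helper, and the sample output is invented.

```python
def full_filesystems(df_output, threshold=90):
    """Return mount points whose Use% meets or exceeds threshold, from `df -P` output."""
    hits = []
    for line in df_output.strip().splitlines()[1:]:  # skip the header row
        fields = line.split()
        use_pct = int(fields[4].rstrip("%"))  # Capacity column, e.g. "95%"
        if use_pct >= threshold:
            hits.append(fields[5])            # Mounted-on column
    return hits

# Invented `df -P` output for illustration.
sample = """Filesystem 1024-blocks Used Available Capacity Mounted on
/dev/sda1 1000000 950000 50000 95% /
/dev/sdb1 2000000 400000 1600000 20% /data"""
print(full_filesystems(sample))  # ['/']
```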

 

EDUCATION SUMMARY

·   Bachelors in Electronics and Communications from JNTU, Hyderabad, India - 2007

 

Certifications

·   Microsoft Certified Professional, 2014

·   MapR Hadoop Admin, 2016

 



Additional Info


 

Current Career Level:

Experienced (Non-Manager)

 

 
